[DPO] Adding weighted preference optimization (WPO) #2141
+103
−19
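WPO (weighted preference optimization) reweights each DPO preference pair by how likely the policy itself finds the pair, so off-policy examples the model assigns low probability contribute less to the loss. A minimal, dependency-free sketch of that idea for a single pair follows; the function name and the use of length-normalized sequence log-probs are assumptions for illustration, not the PR's actual implementation in `DPOTrainer`.

```python
import math

def wpo_dpo_loss(policy_chosen_logp, policy_rejected_logp,
                 ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """Hypothetical sketch: WPO-weighted DPO loss for one preference pair.

    Inputs are (length-normalized) sequence log-probabilities under the
    policy and the frozen reference model.
    """
    # Standard DPO implicit-reward margin between chosen and rejected
    margin = (policy_chosen_logp - policy_rejected_logp) \
           - (ref_chosen_logp - ref_rejected_logp)
    # DPO loss: -log sigmoid(beta * margin)
    loss = -math.log(1.0 / (1.0 + math.exp(-beta * margin)))
    # WPO weight: joint policy probability of the pair, treated as a
    # constant (it would be detached from the gradient in a real trainer)
    weight = math.exp(policy_chosen_logp + policy_rejected_logp)
    return weight * loss
```

With uniform inputs (all log-probs zero, i.e. probability 1) the weight is 1 and the value reduces to the plain DPO loss `log 2`; as the policy's probability of the pair drops, the weight shrinks the pair's contribution toward zero.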